Learning languages from positive examples with dependencies

Authors

  • Jérôme Besombes
  • Jean-Yves Marion
Abstract

We investigate learning dependency grammars from positive data in Gold's identification-in-the-limit model, where the examples are dependency trees. For this, we introduce reversible lexical dependency grammars, which generate a significant class of languages. We demonstrate that reversible dependency languages are learnable, and we provide an algorithm whose running time is polynomial in the size of the examples. Our objective is to contribute to the design and understanding of formal processes of language acquisition. Dependency trees play an important role here because they naturally appear in every phrase-structure tree. Following Tesnière's seminal study (Tesnière, 1959) and ideas of Mel'čuk (Mel'čuk, 1988), we propose a two-tier communication process between two speakers; see Figure 1. Jean transmits a sentence to Marie. At the first stage, Jean generates a structural sentence, such as the dependency tree for "the rabbit runs fast". Then Jean transforms it into the linear phrase "the rabbit runs fast" and sends it to Marie.
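The two-tier process described above, first generate a dependency tree, then flatten it into a linear sentence, can be sketched as follows. This is a minimal illustration under assumed conventions (the `Node` class and the left/right dependent ordering are not the paper's formalism):

```python
class Node:
    """A word together with its dependents in a dependency tree."""
    def __init__(self, word, left=None, right=None):
        self.word = word
        self.left = left or []    # dependents linearized before the head
        self.right = right or []  # dependents linearized after the head

def linearize(node):
    """Second tier: flatten the tree into a word sequence by an
    in-order traversal (left dependents, then head, then right dependents)."""
    words = []
    for dep in node.left:
        words.extend(linearize(dep))
    words.append(node.word)
    for dep in node.right:
        words.extend(linearize(dep))
    return words

# The example from the abstract: "runs" governs "rabbit" and "fast",
# and "rabbit" governs "the".
tree = Node("runs",
            left=[Node("rabbit", left=[Node("the")])],
            right=[Node("fast")])

print(" ".join(linearize(tree)))  # the rabbit runs fast
```

In this sketch the structural sentence (the tree) carries the grammatical dependencies, while `linearize` produces the surface string that Jean transmits to Marie.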


Similar articles

Learning Deterministic even Linear Languages From Positive Examples

We consider the problem of learning deterministic even linear languages from positive examples. We show that, for any nonnegative integer k, the class of LR(k) even linear languages is not learnable from positive examples while there is a subclass called US(k), which is a natural subclass of U(A) in the strong sense, learnable from positive examples. Our learning algorithm identifies this subcl...

Statistical Models of Language Learning and Use

This paper summarizes our recent work in developing statistical models of language which are compatible with the kinds of linguistic structures posited by current linguistic theories. Unlike most work in statistical computational linguistics, we are interested in models which are capable of capturing the kinds of non-local context-sensitive dependencies captured by modern theories of syntax. In...

Slavic Languages in Universal Dependencies

Universal Dependencies (UD) is a project that is developing crosslinguistically consistent treebank annotation for many languages, with the goal of facilitating multilingual parser development, cross-lingual learning and linguistic research from a language typology perspective. It is a merger and extension of several previous efforts aimed at finding unified approaches to parts of speech, morph...

Learning Languages from Examples

In this project I explore the machine learning problem of learning formal languages from positive and negative examples. I describe a technique proposed by Kontorovich, Cortes, and Mohri in their paper titled “Learning Linearly Separable Languages.” The premise of this paper is that there exists a family of languages that are linearly separable and efficiently computable under a certain computa...

Morphotactics as Tier-Based Strictly Local Dependencies

It is commonly accepted that morphological dependencies are finite-state in nature. We argue that the upper bound on morphological expressivity is much lower. Drawing on technical results from computational phonology, we show that a variety of morphotactic phenomena are tier-based strictly local and do not fall into weaker subclasses such as the strictly local or strictly piecewise languages. Si...


Publication year: 2002